Bayesian Error Based Sequences of Mutual Information Bounds

Author

  • Sudhakar Prasad

Abstract

The inverse relation between mutual information (MI) and Bayesian error is sharpened by deriving finite sequences of upper and lower bounds on MI in terms of the minimum probability of error (MPE) and related Bayesian quantities. The well-known Fano upper bound and Feder-Merhav lower bound on equivocation are tightened by including a succession of posterior probabilities, starting at the largest, which directly controls the MPE, and proceeding to successively lower ones. A number of other results are also derived, including a sequence of upper bounds on the MPE in terms of a previously introduced sequence of generalized posterior distributions. The tightness of the various bounds is illustrated for a simple application: joint spatial localization and spectral typing of a point source.
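The paper's refined sequences of bounds are not reproduced here, but the classical endpoints it tightens are easy to check numerically. The following sketch (the joint pmf values are illustrative assumptions, not from the paper) computes the exact equivocation H(X|Y), the MPE attained by guessing the largest posterior, and verifies Fano's upper bound H(X|Y) ≤ h(Pe) + Pe·log2(M−1):

```python
import numpy as np

def equivocation(p_xy):
    """Conditional entropy H(X|Y) in bits for a joint pmf p_xy[x, y]."""
    p_y = p_xy.sum(axis=0)
    h = 0.0
    for y in range(p_xy.shape[1]):
        for x in range(p_xy.shape[0]):
            if p_xy[x, y] > 0:
                h -= p_xy[x, y] * np.log2(p_xy[x, y] / p_y[y])
    return h

def min_prob_error(p_xy):
    """Bayesian MPE: for each observation y, guess the x with largest posterior."""
    return 1.0 - p_xy.max(axis=0).sum()

def fano_upper_bound(pe, M):
    """Fano's inequality: H(X|Y) <= h(Pe) + Pe * log2(M - 1)."""
    hb = 0.0 if pe in (0.0, 1.0) else -pe * np.log2(pe) - (1 - pe) * np.log2(1 - pe)
    return hb + pe * np.log2(M - 1)

# Toy joint pmf over M = 3 hypotheses (rows) and 2 observations (columns).
p_xy = np.array([[0.30, 0.05],
                 [0.10, 0.25],
                 [0.10, 0.20]])

pe = min_prob_error(p_xy)            # Pe = 0.45 for this example
assert equivocation(p_xy) <= fano_upper_bound(pe, 3)
```

Because the Fano bound uses only the largest posterior (through Pe), it is loose whenever the remaining posterior mass is spread unevenly; the paper's sequences incorporate successively lower-ranked posteriors to close exactly this gap.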


Similar articles

A New Unequal Error Protection Technique Based on the Mutual Information of the MPEG-4 Video Frames over Wireless Networks

The performance of video transmission over wireless channels is limited by the channel noise. Thus many error resilience tools have been incorporated into the MPEG-4 video compression method. In addition to these tools, the unequal error protection (UEP) technique has been proposed to protect the different parts in an MPEG-4 video packet with different channel coding rates based on the rate...


Generalization of Information Measures

General formulas for entropy, mutual information, and divergence are established. It is revealed that these quantities are actually determined by three decisive sequences of random variables; which are, respectively, the normalized source information density, the normalized channel information density, and the normalized log-likelihood ratio. In terms of the ultimate cumulative distribution f...


Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information

Information theory is a branch of mathematics. Information theory is used in genetic and bioinformatics analyses and can be used for many analyses related to the biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing and structural-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...


Mutual Information and Conditional Mean Prediction Error

Mutual information is fundamentally important for measuring statistical dependence between variables and for quantifying information transfer by signaling and communication mechanisms. It can, however, be challenging to evaluate for physical models of such mechanisms and to estimate reliably from data. Furthermore, its relationship to better known statistical procedures is still poorly understo...


General Bounds for Predictive Errors in Supervised Learning

Within a Bayesian framework, we calculate general upper and lower bounds for a cumulative entropic error, which measures the success in the supervised learning of an unknown rule from examples. This performance measure is equivalent to the mutual information between the data and the parameter that specifies the rule to be learnt. Both bounds match asymptotically, when the number m of observed d...



Journal:
  • CoRR

Volume: abs/1409.6654  Issue:

Pages: -

Publication date: 2014